Exploration server: Health and Musical Practice

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information is therefore not validated.

Real-Time Aural and Visual Feedback for Improving Violin Intonation.

Internal identifier: 000483 (Main/Exploration); previous: 000482; next: 000484


Authors: Laurel S. Pardue [United Kingdom]; Andrew McPherson [United Kingdom]

Source:

RBID: pubmed:31001159

Abstract

Playing with correct intonation is one of the major challenges for a string player. A player must learn how to physically reproduce a target pitch, but before that, the player must learn what correct intonation is. This requires audiation (the aural equivalent of visualization) of every note, along with self-assessment of whether the pitch played matches the target, and if not, what action should be taken to correct it. A challenge for successful learning is that much of it occurs during practice, typically without outside supervision. A student who has not yet learned to hear correct intonation may repeatedly practice out of tune, blithely normalizing bad habits and bad intonation. The real-time reflective nature of intonation and its consistent demand on attention make it a ripe target for technological intervention. Using a violin augmented to combine fingerboard sensors with audio analysis for real-time pitch detection, we examine the efficacy of three methods of real-time feedback for improving intonation and pitch learning. The first, aural feedback in the form of an in-tune guide pitch following the student in real-time, is inspired by the tradition of students playing along with teachers. The second is visual feedback on intonation correctness using an algorithm optimized for use throughout normal practice. The third is a combination of the two methods, simultaneously providing aural and visual feedback. Twelve beginning violinists, including children and adults, were given four in-situ 20-30 min lessons. Each lesson used one of the intonation feedback methods, along with a control lesson using no feedback. We collected data on intonation accuracy and conducted interviews on student experience and preference. The results varied by player, with evidence of some players being helped by the feedback methods but also cases where the feedback was distracting and intonation suffered. However, interviews suggested a high level of interest and potential in having such tools to help during practice, and results also suggested that it takes time to learn to use the real-time aural and visual feedback. Both methods of feedback demonstrate potential for assisting self-reflection during individual practice.

DOI: 10.3389/fpsyg.2019.00627
PubMed: 31001159
PubMed Central: PMC6455216


The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Real-Time Aural and Visual Feedback for Improving Violin Intonation.</title>
<author>
<name sortKey="Pardue, Laurel S" sort="Pardue, Laurel S" uniqKey="Pardue L" first="Laurel S" last="Pardue">Laurel S. Pardue</name>
<affiliation wicri:level="4">
<nlm:affiliation>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London, United Kingdom.</nlm:affiliation>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London</wicri:regionArea>
<placeName>
<settlement type="city">Londres</settlement>
<region type="country">Angleterre</region>
<region type="région" nuts="1">Grand Londres</region>
</placeName>
<orgName type="university">Université de Londres</orgName>
</affiliation>
</author>
<author>
<name sortKey="Mcpherson, Andrew" sort="Mcpherson, Andrew" uniqKey="Mcpherson A" first="Andrew" last="Mcpherson">Andrew Mcpherson</name>
<affiliation wicri:level="4">
<nlm:affiliation>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London, United Kingdom.</nlm:affiliation>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London</wicri:regionArea>
<placeName>
<settlement type="city">Londres</settlement>
<region type="country">Angleterre</region>
<region type="région" nuts="1">Grand Londres</region>
</placeName>
<orgName type="university">Université de Londres</orgName>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2019">2019</date>
<idno type="RBID">pubmed:31001159</idno>
<idno type="pmid">31001159</idno>
<idno type="doi">10.3389/fpsyg.2019.00627</idno>
<idno type="pmc">PMC6455216</idno>
<idno type="wicri:Area/Main/Corpus">000535</idno>
<idno type="wicri:explorRef" wicri:stream="Main" wicri:step="Corpus" wicri:corpus="PubMed">000535</idno>
<idno type="wicri:Area/Main/Curation">000535</idno>
<idno type="wicri:explorRef" wicri:stream="Main" wicri:step="Curation">000535</idno>
<idno type="wicri:Area/Main/Exploration">000535</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Real-Time Aural and Visual Feedback for Improving Violin Intonation.</title>
<author>
<name sortKey="Pardue, Laurel S" sort="Pardue, Laurel S" uniqKey="Pardue L" first="Laurel S" last="Pardue">Laurel S. Pardue</name>
<affiliation wicri:level="4">
<nlm:affiliation>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London, United Kingdom.</nlm:affiliation>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London</wicri:regionArea>
<placeName>
<settlement type="city">Londres</settlement>
<region type="country">Angleterre</region>
<region type="région" nuts="1">Grand Londres</region>
</placeName>
<orgName type="university">Université de Londres</orgName>
</affiliation>
</author>
<author>
<name sortKey="Mcpherson, Andrew" sort="Mcpherson, Andrew" uniqKey="Mcpherson A" first="Andrew" last="Mcpherson">Andrew Mcpherson</name>
<affiliation wicri:level="4">
<nlm:affiliation>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London, United Kingdom.</nlm:affiliation>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London</wicri:regionArea>
<placeName>
<settlement type="city">Londres</settlement>
<region type="country">Angleterre</region>
<region type="région" nuts="1">Grand Londres</region>
</placeName>
<orgName type="university">Université de Londres</orgName>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Frontiers in psychology</title>
<idno type="ISSN">1664-1078</idno>
<imprint>
<date when="2019" type="published">2019</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Playing with correct intonation is one of the major challenges for a string player. A player must learn how to physically reproduce a target pitch, but before that, the player must learn what correct intonation is. This requires audiation- the aural equivalent of visualization- of every note along with self-assessment whether the pitch played matches the target, and if not, what action should be taken to correct it. A challenge for successful learning is that much of it occurs during practice, typically without outside supervision. A student who has not yet learned to hear correct intonation may repeatedly practice out of tune, blithely normalizing bad habits and bad intonation. The real-time reflective nature of intonation and its consistent demand on attention make it a ripe target for technological intervention. Using a violin augmented to combine fingerboard sensors with audio analysis for real-time pitch detection, we examine the efficacy of three methods of real-time feedback for improving intonation and pitch learning. The first, aural feedback in the form of an in-tune guide pitch following the student in real-time, is inspired by the tradition of students playing along with teachers. The second is visual feedback on intonation correctness using an algorithm optimized for use throughout normal practice. The third is a combination of the two methods, simultaneously providing aural and visual feedback. Twelve beginning violinists, including children and adults, were given four
<i>in-situ</i>
20-30 min lessons. Each lesson used one of the intonation feedback methods, along with a control lesson using no feedback. We collected data on intonation accuracy and conducted interviews on student experience and preference. The results varied by player, with evidence of some players being helped by the feedback methods but also cases where the feedback was distracting and intonation suffered. However interviews suggested a high level of interest and potential in having such tools to help during practice, and results also suggested that it takes time to learn to use the real-time aural and visual feedback. Both methods of feedback demonstrate potential for assisting self-reflection during individual practice.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Status="PubMed-not-MEDLINE" Owner="NLM">
<PMID Version="1">31001159</PMID>
<DateRevised>
<Year>2020</Year>
<Month>09</Month>
<Day>29</Day>
</DateRevised>
<Article PubModel="Electronic-eCollection">
<Journal>
<ISSN IssnType="Print">1664-1078</ISSN>
<JournalIssue CitedMedium="Print">
<Volume>10</Volume>
<PubDate>
<Year>2019</Year>
</PubDate>
</JournalIssue>
<Title>Frontiers in psychology</Title>
<ISOAbbreviation>Front Psychol</ISOAbbreviation>
</Journal>
<ArticleTitle>Real-Time Aural and Visual Feedback for Improving Violin Intonation.</ArticleTitle>
<Pagination>
<MedlinePgn>627</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.3389/fpsyg.2019.00627</ELocationID>
<Abstract>
<AbstractText>Playing with correct intonation is one of the major challenges for a string player. A player must learn how to physically reproduce a target pitch, but before that, the player must learn what correct intonation is. This requires audiation- the aural equivalent of visualization- of every note along with self-assessment whether the pitch played matches the target, and if not, what action should be taken to correct it. A challenge for successful learning is that much of it occurs during practice, typically without outside supervision. A student who has not yet learned to hear correct intonation may repeatedly practice out of tune, blithely normalizing bad habits and bad intonation. The real-time reflective nature of intonation and its consistent demand on attention make it a ripe target for technological intervention. Using a violin augmented to combine fingerboard sensors with audio analysis for real-time pitch detection, we examine the efficacy of three methods of real-time feedback for improving intonation and pitch learning. The first, aural feedback in the form of an in-tune guide pitch following the student in real-time, is inspired by the tradition of students playing along with teachers. The second is visual feedback on intonation correctness using an algorithm optimized for use throughout normal practice. The third is a combination of the two methods, simultaneously providing aural and visual feedback. Twelve beginning violinists, including children and adults, were given four
<i>in-situ</i>
20-30 min lessons. Each lesson used one of the intonation feedback methods, along with a control lesson using no feedback. We collected data on intonation accuracy and conducted interviews on student experience and preference. The results varied by player, with evidence of some players being helped by the feedback methods but also cases where the feedback was distracting and intonation suffered. However interviews suggested a high level of interest and potential in having such tools to help during practice, and results also suggested that it takes time to learn to use the real-time aural and visual feedback. Both methods of feedback demonstrate potential for assisting self-reflection during individual practice.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Pardue</LastName>
<ForeName>Laurel S</ForeName>
<Initials>LS</Initials>
<AffiliationInfo>
<Affiliation>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London, United Kingdom.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>McPherson</LastName>
<ForeName>Andrew</ForeName>
<Initials>A</Initials>
<AffiliationInfo>
<Affiliation>Augmented Instruments Laboratory, Centre for Digital Music, Queen Mary University of London, Electrical Engineering &amp; Computer Science, London, United Kingdom.</Affiliation>
</AffiliationInfo>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2019</Year>
<Month>04</Month>
<Day>02</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>Switzerland</Country>
<MedlineTA>Front Psychol</MedlineTA>
<NlmUniqueID>101550902</NlmUniqueID>
<ISSNLinking>1664-1078</ISSNLinking>
</MedlineJournalInfo>
<KeywordList Owner="NOTNLM">
<Keyword MajorTopicYN="N">aural feedback</Keyword>
<Keyword MajorTopicYN="N">intonation</Keyword>
<Keyword MajorTopicYN="N">motor learning</Keyword>
<Keyword MajorTopicYN="N">pedagogy</Keyword>
<Keyword MajorTopicYN="N">real-time feedback</Keyword>
<Keyword MajorTopicYN="N">violin</Keyword>
<Keyword MajorTopicYN="N">visual feedback</Keyword>
</KeywordList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="received">
<Year>2018</Year>
<Month>05</Month>
<Day>30</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="accepted">
<Year>2019</Year>
<Month>03</Month>
<Day>06</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2019</Year>
<Month>4</Month>
<Day>20</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2019</Year>
<Month>4</Month>
<Day>20</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2019</Year>
<Month>4</Month>
<Day>20</Day>
<Hour>6</Hour>
<Minute>1</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>epublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pubmed">31001159</ArticleId>
<ArticleId IdType="doi">10.3389/fpsyg.2019.00627</ArticleId>
<ArticleId IdType="pmc">PMC6455216</ArticleId>
</ArticleIdList>
<ReferenceList>
<Reference>
<Citation>J Acoust Soc Am. 2002 Apr;111(4):1917-30</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">12002874</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Mot Behav. 1971 Jun;3(2):111-49</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">15155169</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Hear Res. 2006 Sep;219(1-2):36-47</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">16839723</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Neuroreport. 2009 Oct 7;20(15):1392-6</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">19738497</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Front Psychol. 2014 Jan 31;5:44</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">24550867</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Front Psychol. 2018 Dec 06;9:2436</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">30574109</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Acoust Soc Am. 1996 Sep;100(3):1728-35</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">8817899</ArticleId>
</ArticleIdList>
</Reference>
</ReferenceList>
</PubmedData>
</pubmed>
<affiliations>
<list>
<country>
<li>Royaume-Uni</li>
</country>
<region>
<li>Angleterre</li>
<li>Grand Londres</li>
</region>
<settlement>
<li>Londres</li>
</settlement>
<orgName>
<li>Université de Londres</li>
</orgName>
</list>
<tree>
<country name="Royaume-Uni">
<region name="Angleterre">
<name sortKey="Pardue, Laurel S" sort="Pardue, Laurel S" uniqKey="Pardue L" first="Laurel S" last="Pardue">Laurel S. Pardue</name>
</region>
<name sortKey="Mcpherson, Andrew" sort="Mcpherson, Andrew" uniqKey="Mcpherson A" first="Andrew" last="Mcpherson">Andrew Mcpherson</name>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Sante/explor/SanteMusiqueV1/Data/Main/Exploration
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000483 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 000483 | SxmlIndent | more
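
For a quick check of the record without paging through the full output, the same extraction can be piped into standard Unix tools. The sketch below reuses the HfdSelect call shown above and simply filters for the DOI element; the grep pattern is an illustrative assumption based on the <idno type="doi"> element visible in the XML record.

HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 000483 | SxmlIndent | grep 'idno type="doi"'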

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Sante
   |area=    SanteMusiqueV1
   |flux=    Main
   |étape=   Exploration
   |type=    RBID
   |clé=     pubmed:31001159
   |texte=   Real-Time Aural and Visual Feedback for Improving Violin Intonation.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Exploration/RBID.i   -Sk "pubmed:31001159" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd   \
       | NlmPubMed2Wicri -a SanteMusiqueV1 
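
To verify which internal key the RBID resolves to before generating pages, the first stage of the pipeline above can also be run on its own (a usage sketch assuming the same index file and environment as above):

HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Exploration/RBID.i   -Sk "pubmed:31001159"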

Wicri

This area was generated with Dilib version V0.6.38.
Data generation: Mon Mar 8 15:23:44 2021. Site generation: Mon Mar 8 15:23:58 2021